# Instruction Fine-Tuning

## Qwen2.5 Recursive Coder 14B Instruct

A 14B-parameter code generation and comprehension model based on the Qwen2.5 architecture, built by combining multiple specialized coding models with the Model Stock merge method.

License: Apache-2.0 · Type: Large Language Model · Tags: Transformers · Publisher: spacematt · Downloads: 39 · Likes: 2

## Qwen2.5 CompositeFlow Coder 14B Instruct

A merged model based on Qwen2.5-Coder-14B-Instruct that combines multiple specialized coding models using the mergekit tool.

License: Apache-2.0 · Type: Large Language Model · Tags: Transformers · Publisher: spacematt · Downloads: 31 · Likes: 3
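
Both Qwen2.5 coder entries above are produced by model merging rather than fresh training. As a minimal sketch, assuming placeholder donor models (not the actual recipes behind either release), a mergekit configuration using the Model Stock method could be written and launched like this:

```python
# Sketch only: writes a mergekit config that merges hypothetical fine-tunes of
# Qwen2.5-Coder-14B-Instruct with the Model Stock method. The donor model ids
# below are placeholders, not the real ingredients of the releases above.
import yaml

config = {
    "merge_method": "model_stock",                    # Model Stock merge algorithm
    "base_model": "Qwen/Qwen2.5-Coder-14B-Instruct",  # shared base checkpoint
    "models": [
        {"model": "example-org/qwen2.5-coder-14b-finetune-a"},  # placeholder donor
        {"model": "example-org/qwen2.5-coder-14b-finetune-b"},  # placeholder donor
    ],
    "dtype": "bfloat16",
}

with open("merge-config.yml", "w") as f:
    yaml.safe_dump(config, f, sort_keys=False)

# The merge itself is run from the command line, e.g.:
#   mergekit-yaml merge-config.yml ./merged-model
```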

## Gemma 3 12b It GGUF

Gemma-3-12b-it is a large language model developed by Google, based on the transformer architecture and focused on text generation tasks; this listing distributes GGUF builds of the model.

Type: Large Language Model · Publisher: second-state · Downloads: 583 · Likes: 1
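
GGUF builds like this one target local runtimes in the llama.cpp family. A minimal sketch with llama-cpp-python, assuming a placeholder quantized file name that you would substitute with the file actually downloaded from the second-state repository:

```python
# Minimal local-inference sketch with llama-cpp-python; the GGUF file name and
# quantization level are placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="gemma-3-12b-it-Q4_K_M.gguf",  # placeholder file name
    n_ctx=4096,                               # context window to allocate
)

result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize what a GGUF file is in two sentences."}],
    max_tokens=128,
)
print(result["choices"][0]["message"]["content"])
```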

## Jais Family 30b 16k Chat

The Jais family is a series of bilingual large language models optimized for Arabic while retaining strong English capabilities. The 30B-16K chat version has 30 billion parameters and supports a context length of 16,384 tokens.

License: Apache-2.0 · Type: Large Language Model · Tags: Safetensors, Supports Multiple Languages · Publisher: inceptionai · Downloads: 59 · Likes: 12
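
A hedged sketch of querying the chat model in Arabic through transformers, assuming the inceptionai/jais-family-30b-16k-chat repository ships a chat template and custom modeling code (hence trust_remote_code=True); the model card is authoritative for the exact prompt format and the hardware a 30B checkpoint requires:

```python
# Sketch only: generic transformers chat flow; verify repository id, prompt
# format, and memory requirements against the official model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "inceptionai/jais-family-30b-16k-chat"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)

# Arabic prompt: "What is the capital of the United Arab Emirates?"
messages = [{"role": "user", "content": "ما هي عاصمة الإمارات العربية المتحدة؟"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```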

## Gte Qwen2 1.5B Instruct

A general-purpose text embedding model based on Qwen2-1.5B, supporting multilingual and long-text processing.

License: Apache-2.0 · Type: Text Embedding · Tags: Transformers · Publisher: Alibaba-NLP · Downloads: 242.12k · Likes: 207
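
For retrieval-style use, the model can be loaded through sentence-transformers. A minimal sketch, assuming the published Alibaba-NLP/gte-Qwen2-1.5B-instruct repository and its query prompt (check the model card for the recommended instruction format):

```python
# Embedding and similarity sketch with sentence-transformers; prompt_name="query"
# assumes the repository defines a query prompt, as is typical for GTE
# instruction-style embedding models.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Alibaba-NLP/gte-Qwen2-1.5B-instruct", trust_remote_code=True)

documents = [
    "The capital of France is Paris.",
    "Gradient descent iteratively minimizes a loss function.",
]
doc_embeddings = model.encode(documents)
query_embedding = model.encode("what is the capital of france", prompt_name="query")

# Cosine similarity between the query and each document
print(util.cos_sim(query_embedding, doc_embeddings))
```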

## Codegen25 7b Instruct P

CodeGen2.5 is a family of autoregressive language models for program synthesis that improves on CodeGen2 and is trained on StarCoderData, achieving performance comparable to larger models at a smaller scale.

License: Other · Type: Large Language Model · Tags: Transformers, Other · Publisher: Salesforce · Downloads: 61 · Likes: 36
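
A hedged generation sketch with transformers, assuming the repository id Salesforce/codegen25-7b-instruct_P and that the CodeGen2.5 tokenizer ships custom code (hence trust_remote_code=True):

```python
# Program-synthesis sketch: complete a Python function signature with
# CodeGen2.5-Instruct. Repository id and dtype/device choices are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Salesforce/codegen25-7b-instruct_P"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

prompt = "def quicksort(arr):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```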

## Metharme 1.3b

An instruction fine-tuned model built on the deduplicated version of Pythia 1.4B, specializing in fiction writing and dialogue generation.

License: Apache-2.0 · Type: Large Language Model · Tags: Transformers, English · Publisher: PygmalionAI · Downloads: 133 · Likes: 24
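
A hedged prompting sketch for the Metharme format, which interleaves <|system|>, <|user|>, and <|model|> role markers; the repository id and sampling settings below are assumptions, so verify both against the model card:

```python
# Sketch only: builds a Metharme-style prompt and samples a continuation.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PygmalionAI/metharme-1.3b"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

prompt = (
    "<|system|>Continue the scene in vivid, third-person prose."
    "<|user|>The lighthouse keeper hears a knock at midnight."
    "<|model|>"
)
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=120, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```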